# Honcho Python SDK
The official Python library for the Honcho conversational memory platform. Honcho provides tools for managing peers, sessions, and conversation context across multi-party interactions, enabling advanced conversational AI applications with persistent memory and theory-of-mind capabilities.
## Installation

```bash
pip install honcho-ai
```
## Quick Start

```python
from honcho import Honcho

# Initialize client
client = Honcho(api_key="your-api-key")

# Create peers (participants in conversations)
alice = client.peer("alice")
bob = client.peer("bob")

# Create a session for group conversations
session = client.session("conversation-1")

# Add messages to the session
session.add_messages([
    alice.message("Hello, Bob!"),
    bob.message("Hi Alice, how are you?"),
])

# Query conversation context
response = alice.chat("What did Bob say to the user?")
print(response)
```
## Core Concepts

### Peers

Peers represent participants in conversations:

```python
# Create peers
assistant = client.peer("assistant")
user = client.peer("user-123")

# Chat with global context
response = user.chat("What did I talk about yesterday?")

# Chat from the perspective of another peer
response = user.chat("Does the assistant know my preferences?", target=assistant)
```
### Sessions

Sessions group related conversations and messages:

```python
# Create a session
session = client.session("project-discussion")

# Add peers to the session
session.add_peers([alice, bob])

# Add messages
session.add_messages([
    alice.message("Let's discuss the project timeline"),
    bob.message("I think we need two more weeks"),
])

# Get conversation context
context = session.context()
```
### Messages and Context

Retrieve conversation history and convert it into the message formats expected by common LLM providers:

```python
# Get messages from a session
messages = session.messages()

# Get the session context, then convert it to OpenAI format for further prompting
context = session.context()
openai_messages = context.to_openai(assistant="assistant")

# Convert to Anthropic format for further prompting
anthropic_messages = context.to_anthropic(assistant="assistant")
```
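The OpenAI and Anthropic chat formats that these conversions target are plain lists of role/content dictionaries. As an illustration of the target shape only (this helper is not part of the Honcho SDK), a conversation can be mapped into OpenAI-style messages like this:

```python
def to_openai_format(turns, assistant):
    """Map (speaker, text) turns into OpenAI-style chat messages.

    The speaker matching `assistant` becomes the "assistant" role;
    every other speaker becomes "user".
    """
    return [
        {"role": "assistant" if speaker == assistant else "user", "content": text}
        for speaker, text in turns
    ]

turns = [
    ("alice", "Hello, Bob!"),
    ("assistant", "Hi Alice, how can I help?"),
]
print(to_openai_format(turns, assistant="assistant"))
```

The resulting list can be passed directly as the `messages` argument of a chat-completion call.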
## Async Support

The SDK provides async access via the `.aio` accessor on any instance:

```python
import asyncio

from honcho import Honcho

async def main():
    client = Honcho(api_key="your-api-key")

    # Async peer and session creation
    peer = await client.aio.peer("user-123")
    session = await client.aio.session("conversation-1")

    # Async chat
    response = await peer.aio.chat("What does this user prefer?")

    # Async iteration
    async for p in client.aio.peers():
        print(p.id)

asyncio.run(main())
```
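The `.aio` accessor is a common way to expose sync and async variants of the same API from a single object. A minimal, self-contained sketch of the pattern (an illustration only, not Honcho's internals):

```python
import asyncio

class _AsyncAccessor:
    """Async counterparts of the owning object's sync methods."""

    def __init__(self, owner):
        self._owner = owner

    async def greet(self, name):
        # Run the sync implementation in a worker thread so the
        # event loop is not blocked.
        return await asyncio.to_thread(self._owner.greet, name)

class Client:
    def __init__(self):
        self.aio = _AsyncAccessor(self)

    def greet(self, name):
        return f"hello, {name}"

client = Client()
print(client.greet("alice"))                   # sync call
print(asyncio.run(client.aio.greet("alice")))  # same operation, async
```

This keeps one implementation of each operation while letting callers pick the calling convention that fits their application.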
## Metadata Management

Attach arbitrary metadata to peers and sessions:

```python
# Set peer metadata
user.set_metadata({"location": "San Francisco", "preferences": {"theme": "dark"}})

# Set session metadata
session.set_metadata({"topic": "project-planning", "priority": "high"})
```
## Multi-Perspective Queries

Query what one peer knows about another, either globally or scoped to a session:

```python
# Alice's view of what Bob knows
response = alice.chat("Does Bob remember our discussion about the budget?", target=bob)

# Session-specific perspective
response = alice.chat(
    "What does Bob think about this project?",
    target=bob,
    session=session,
)
```
## Configuration

### Environment Variables

```bash
export HONCHO_API_KEY="your-api-key"
export HONCHO_BASE_URL="https://api.honcho.dev"  # Optional
export HONCHO_WORKSPACE_ID="your-workspace"      # Optional
```
### Client Options

```python
client = Honcho(
    api_key="your-api-key",
    environment="production",  # or "local"
    workspace_id="custom-workspace",
    base_url="https://api.honcho.dev",
)
```
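In SDK clients of this kind, explicit constructor arguments typically take precedence over environment variables (this is an assumption about Honcho's behavior, not documented above). The resolution logic can be sketched as:

```python
import os

def resolve_setting(explicit, env_var, default=None):
    """Pick an explicit value first, then the environment, then a default."""
    if explicit is not None:
        return explicit
    return os.environ.get(env_var, default)

# Example: base_url falls back to HONCHO_BASE_URL when not passed explicitly.
os.environ["HONCHO_BASE_URL"] = "https://api.honcho.dev"
print(resolve_setting(None, "HONCHO_BASE_URL"))                     # from the environment
print(resolve_setting("http://localhost:8000", "HONCHO_BASE_URL"))  # explicit wins
```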
## License

Apache 2.0 - see LICENSE for details.
## File Details

### honcho_ai-2.1.1.tar.gz

- Size: 47.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.9.24

| Algorithm | Hash digest |
|---|---|
| SHA256 | d273d86ce3e7361755c084b0ddc44160d49c59c538b278df606e6437bd893a61 |
| MD5 | 5f2f566127da2523f799c991c2bd2010 |
| BLAKE2b-256 | bd1823baf5c2bd256cf25def4beb64af1e5b38f6ac0934d0f8d7348d279f217f |
### honcho_ai-2.1.1-py3-none-any.whl

- Size: 58.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.9.24

| Algorithm | Hash digest |
|---|---|
| SHA256 | a6b57b359f42b6f737aecc6ff0f00dd5d3ccf7327a3b2c3c6c1ada8cfa6ef505 |
| MD5 | f553bc9e467e6d9692518dcda1457827 |
| BLAKE2b-256 | 1838aa0a1e2eee4549ab7700605ab2797acb8c267265dd19c650ba976f96e43a |